# English Language Models

## Marin 8B Instruct

Marin 8B is an open-source 8B-parameter large language model built on the Llama architecture, supporting English text generation.

- **License:** Apache-2.0
- **Tags:** Large Language Model, Safetensors, English
- **Publisher:** marin-community
- **Downloads:** 239 · **Likes:** 1
## Falcon-E-1B-Instruct

Falcon-E-1B-Instruct is an efficient language model built on a 1.58-bit architecture and optimized for edge devices, with a low memory footprint and high performance.

- **License:** Other
- **Tags:** Large Language Model, Transformers
- **Publisher:** tiiuae
- **Downloads:** 87 · **Likes:** 7
## Falcon-E-3B-Base

Falcon-E is a 1.58-bit quantized language model developed by TII, featuring a pure Transformer architecture designed for efficient inference.

- **License:** Other
- **Tags:** Large Language Model, Transformers
- **Publisher:** tiiuae
- **Downloads:** 51 · **Likes:** 6
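Both Falcon-E entries describe a "1.58-bit" design: each weight is restricted to the ternary set {-1, 0, +1}, and encoding one of three values takes log2(3) ≈ 1.58 bits. A minimal sketch of that kind of ternary quantization follows — illustrative only, not TII's actual Falcon-E training or inference code:

```python
import math

def ternary_quantize(weights):
    """Quantize a list of floats to {-1, 0, +1} with a per-tensor scale.

    Illustrative sketch of ternary ("1.58-bit") quantization; the real
    Falcon-E pipeline is more involved (e.g. quantization-aware training).
    """
    # Use the mean absolute value as the scale, then round each scaled
    # weight to the nearest ternary level and clamp into [-1, 1].
    scale = sum(abs(w) for w in weights) / len(weights) or 1.0
    quantized = [max(-1, min(1, round(w / scale))) for w in weights]
    return quantized, scale

# Information content per ternary weight: log2(3) ≈ 1.585 bits,
# which is where the "1.58-bit" label comes from.
bits_per_weight = math.log2(3)

q, s = ternary_quantize([0.9, -0.05, -1.2, 0.4])  # q == [1, 0, -1, 1]
```

In practice the ternary weights let matrix multiplication be replaced by additions and subtractions, which is what makes this family attractive for low-memory edge inference.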
## OLMo 2 0325 32B

OLMo 2 32B is the largest model, at 32B parameters, in the OLMo open language model series released by the Allen Institute for AI (AI2). It is open-sourced under the Apache 2.0 license and supports English.

- **License:** Apache-2.0
- **Tags:** Large Language Model, Transformers, English
- **Publisher:** allenai
- **Downloads:** 2,246 · **Likes:** 47
## YuE S1 7B Anneal En CoT (Exl2)

An Exllamav2-quantized version of the m-a-p/YuE-s1-7B-anneal-en-cot model, suited to text generation tasks and particularly strong in music-related domains.

- **License:** Apache-2.0
- **Tags:** Large Language Model, English
- **Publisher:** Doctor-Shotgun
- **Downloads:** 94 · **Likes:** 10
## GPT-1

A Transformer-based language model released by OpenAI, pre-trained on large-scale corpora, with strong text generation capabilities.

- **License:** MIT
- **Tags:** Large Language Model, Transformers, English
- **Publisher:** lgaalves
- **Downloads:** 310 · **Likes:** 5
## BTLM-3B-8k-Base

BTLM-3B-8k-base is a 3-billion-parameter language model with an 8k context length, trained on the 627-billion-token SlimPajama dataset and delivering performance comparable to open-source 7-billion-parameter models.

- **License:** Apache-2.0
- **Tags:** Large Language Model, Transformers, English
- **Publisher:** cerebras
- **Downloads:** 2,078 · **Likes:** 262
## GPT-2 Small

GPT-2 is an autoregressive language model based on the Transformer architecture. It is pre-trained on a large-scale English corpus through self-supervised learning and excels at text generation tasks.

- **License:** MIT
- **Tags:** Large Language Model, Transformers, English
- **Publisher:** ComCom
- **Downloads:** 1,032 · **Likes:** 3
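"Autoregressive" in the GPT-2 description means the model generates text one token at a time, each step conditioning on everything produced so far. A toy greedy-decoding loop illustrates the control flow — here a hard-coded bigram table stands in for a real model's next-token predictions:

```python
# Toy next-token lookup standing in for a language model's predictions.
BIGRAMS = {
    "the": "cat",
    "cat": "sat",
    "sat": "on",
    "on": "the",
}

def generate(prompt, max_new_tokens):
    """Greedy autoregressive decoding: each step appends the most likely
    next token given the sequence generated so far."""
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        next_token = BIGRAMS.get(tokens[-1])
        if next_token is None:  # no known continuation: stop early
            break
        tokens.append(next_token)
    return " ".join(tokens)

print(generate("the", 4))  # "the cat sat on the"
```

A real GPT-2 replaces the lookup table with a Transformer producing a probability distribution over the vocabulary, but the generation loop has the same shape.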
## Large (Funnel Transformer)

A Transformer model pre-trained on an English corpus with an ELECTRA-like objective, learning intrinsic representations of English through self-supervised methods.

- **License:** Apache-2.0
- **Tags:** Large Language Model, Transformers, English
- **Publisher:** funnel-transformer
- **Downloads:** 190 · **Likes:** 2
## RoBERTa Med Small 1M 1

A RoBERTa model pretrained on a small-scale dataset of 1M tokens, using the MED-SMALL architecture, suitable for text-understanding tasks.

- **Tags:** Large Language Model
- **Publisher:** nyu-mll
- **Downloads:** 23 · **Likes:** 1
## DistilBERT Base Uncased Fine-tuned on CoLA

A lightweight text classification model based on DistilBERT, fine-tuned on the GLUE CoLA task to judge the grammatical acceptability of sentences.

- **License:** Apache-2.0
- **Tags:** Text Classification, Transformers
- **Publisher:** histinct7002
- **Downloads:** 15 · **Likes:** 0
## Transfo-XL WT103

Transformer-XL is a causal Transformer architecture with relative position encoding. By reusing previously computed hidden states it can capture longer context, making it well suited to text generation tasks.

- **Tags:** Text Generation, Transformers, English
- **Publisher:** transfo-xl
- **Downloads:** 4,498 · **Likes:** 15
## GPT-Neo 2.7B

GPT-Neo 2.7B is a 2.7-billion-parameter Transformer language model, EleutherAI's replication of the GPT-3 architecture, trained on the Pile dataset.

- **License:** MIT
- **Tags:** Large Language Model, English
- **Publisher:** EleutherAI
- **Downloads:** 52.68k · **Likes:** 486